# DeepSeek-V3 Distillation
Virtuoso Medium V2 (arcee-ai)
License: Apache-2.0
A 32-billion-parameter language model based on the Qwen-2.5-32B architecture, trained through DeepSeek-V3 distillation, demonstrating excellent performance across multiple benchmarks.
Tags: Large Language Model, Transformers
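
Since the model is distributed in the Transformers format, it can be loaded with the standard Hugging Face `AutoModelForCausalLM` API. Below is a minimal sketch, assuming the Hub repository id `arcee-ai/Virtuoso-Medium-v2` (check the publisher's page for the exact id) and hardware with enough memory for a 32B-parameter checkpoint:

```python
# Minimal sketch: loading Virtuoso Medium V2 with Hugging Face Transformers.
# Assumption: the Hub repository id is "arcee-ai/Virtuoso-Medium-v2".
# Requires `transformers` and `accelerate`, plus sufficient GPU memory.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "arcee-ai/Virtuoso-Medium-v2"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint config
    device_map="auto",    # shard across available GPUs via accelerate
)

prompt = "Explain knowledge distillation in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```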